On Oracle Inequalities Related to High Dimensional Linear Models

Author

  • YURI GOLUBEV
Abstract

We consider the problem of estimating an unknown vector θ from the noisy data Y = Aθ + ε, where A is a known m × n matrix and ε is a white Gaussian noise. It is assumed that n is large and A is ill-posed. Therefore, in order to estimate θ, a spectral regularization method is used, and our goal is to choose the spectral regularization parameter with the help of the data Y. We study data-driven regularization methods based on the empirical risk minimization principle and provide some new oracle inequalities related to this approach.
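To make the setup concrete, here is a minimal numerical sketch of spectral regularization with a data-driven choice of the regularization parameter by empirical risk minimization. It is not the paper's exact procedure: it assumes a Tikhonov-type filter g_α(s) = s/(s² + α), a known noise level σ, and a Mallows-type unbiased estimate of the prediction risk; all function names and the toy problem are illustrative.

import numpy as np

def spectral_estimate(Y, A, alpha):
    """Tikhonov-type spectral estimator: theta_hat = V g_alpha(S) U^T Y."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    filt = s / (s**2 + alpha)               # spectral filter g_alpha(s)
    return Vt.T @ (filt * (U.T @ Y))

def empirical_risk(Y, A, alpha, sigma):
    """Unbiased (up to a constant) estimate of the prediction risk E||A(theta_hat - theta)||^2."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    theta_hat = Vt.T @ ((s / (s**2 + alpha)) * (U.T @ Y))
    df = np.sum(s**2 / (s**2 + alpha))       # effective degrees of freedom tr(A H_alpha)
    return np.sum((Y - A @ theta_hat)**2) + 2 * sigma**2 * df

def choose_alpha(Y, A, sigma, grid):
    """Pick the regularization parameter minimizing the empirical risk over a grid."""
    risks = [empirical_risk(Y, A, a, sigma) for a in grid]
    return grid[int(np.argmin(risks))]

# Toy usage on a mildly ill-posed problem (illustrative data, not from the paper)
rng = np.random.default_rng(0)
m, n, sigma = 100, 50, 0.1
A = rng.standard_normal((m, n)) @ np.diag(np.linspace(1.0, 1e-3, n))
theta = rng.standard_normal(n)
Y = A @ theta + sigma * rng.standard_normal(m)
alpha_hat = choose_alpha(Y, A, sigma, np.logspace(-6, 1, 50))
theta_hat = spectral_estimate(Y, A, alpha_hat)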


Similar articles

Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival...


Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space

High-dimensional data analysis has motivated a spectrum of regularization methods for variable selection and sparse modeling, with two popular methods being convex and concave ones. A long debate has taken place on whether one class dominates the other, an important question both in theory and to practitioners. In this article, we characterize the asymptotic equivalence of regularization method...


Censored linear model in high dimensions

Censored data are quite common in statistics and have been studied in depth in recent years (for some early references, see Powell (1984), Murphy et al. (1999), Chay and Powell (2001)). In this paper we consider censored high-dimensional data. High-dimensional models are in some ways more complex than their low-dimensional versions, therefore some different techniques are required. For the linea...


Sparse oracle inequalities for variable selection via regularized quantization

We give oracle inequalities on procedures that combine quantization and variable selection via a weighted Lasso k-means type algorithm. The results are derived for a general family of weights, which can be tuned to size the influence of the variables in different ways. Moreover, these theoretical guarantees are proved to adapt to the corresponding sparsity of the optimal codebooks, suggesting th...


Group Lasso for generalized linear models in high dimension

We present a Group Lasso procedure for generalized linear models (GLMs) and we study the properties of this estimator applied to sparse high-dimensional GLMs. Under general conditions on the joint distribution of the observable covariates, we provide oracle inequalities promoting group sparsity of the covariates. We get convergence rates for the prediction and estimation error and we show...
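As a rough illustration of the kind of estimator discussed above, here is a minimal proximal-gradient sketch of a group-Lasso penalty for logistic regression (one common GLM). The grouping, step size, and penalty level are illustrative assumptions and the code is not taken from the cited work.

import numpy as np

def group_soft_threshold(beta, groups, thresh):
    """Block soft-thresholding: shrink each group of coefficients toward zero."""
    out = beta.copy()
    for g in groups:
        norm = np.linalg.norm(beta[g])
        out[g] = 0.0 if norm <= thresh else (1 - thresh / norm) * beta[g]
    return out

def group_lasso_logistic(X, y, groups, lam, step=0.01, n_iter=500):
    """Proximal gradient descent for logistic loss + lam * sum_g ||beta_g||_2."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
        grad = X.T @ (p - y) / len(y)         # gradient of the logistic loss
        beta = group_soft_threshold(beta - step * grad, groups, step * lam)
    return beta

# Toy usage: 3 groups of 4 covariates, only the first group is active (illustrative)
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 12))
beta_true = np.concatenate([np.ones(4), np.zeros(8)])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ beta_true))).astype(float)
groups = [list(range(4 * k, 4 * k + 4)) for k in range(3)]
beta_hat = group_lasso_logistic(X, y, groups, lam=0.05)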




Publication date: 2007